Semi-Orthogonal Multilinear PCA with Relaxed Start
Authors
Abstract
Principal component analysis (PCA) is an unsupervised method for learning low-dimensional features with orthogonal projections. Multilinear PCA methods extend PCA to deal with multidimensional data (tensors) directly via tensor-to-tensor projection or tensor-to-vector projection (TVP). However, under the TVP setting, it is difficult to develop an effective multilinear PCA method with the orthogonality constraint. This paper tackles this problem by proposing a novel Semi-Orthogonal Multilinear PCA (SO-MPCA) approach. SO-MPCA learns low-dimensional features directly from tensors via TVP by imposing the orthogonality constraint in only one mode. This formulation results in more captured variance and more learned features than full orthogonality. For better generalization, we further introduce a relaxed start (RS) strategy to get SO-MPCA-RS by fixing the starting projection vectors, which increases the bias and reduces the variance of the learning model. Experiments on both face (2D) and gait (3D) data demonstrate that SO-MPCA-RS outperforms other competing algorithms on the whole, and the relaxed start strategy is also effective for other TVP-based PCA methods.
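To make the tensor-to-vector projection (TVP) setting concrete, the following is a minimal NumPy sketch under simplifying assumptions: 2nd-order sample tensors, a plain alternating eigen-update in place of the paper's successive variance maximization, and no relaxed-start initialization. The function names tvp_features and semi_orthogonal_mpca are hypothetical, not the authors' implementation.

```python
import numpy as np

def tvp_features(X, U):
    """Tensor-to-vector projection: project each 2nd-order sample tensor
    onto P elementary multilinear projections (EMPs).
    X: (M, I1, I2) array of M sample tensors; U: list of P tuples (u1, u2).
    Returns an (M, P) feature matrix."""
    return np.stack([np.einsum('mij,i,j->m', X, u1, u2) for u1, u2 in U], axis=1)

def semi_orthogonal_mpca(X, P, n_iter=10, seed=0):
    """Greedy variance maximization under a semi-orthogonality constraint:
    mode-1 vectors of different EMPs are kept mutually orthogonal, while
    mode-2 vectors are only required to have unit norm. Assumes P <= I1."""
    rng = np.random.default_rng(seed)
    M, I1, I2 = X.shape
    Xc = X - X.mean(axis=0)            # center the sample tensors
    U, basis1 = [], []                 # basis1 stores previous mode-1 vectors
    for p in range(P):
        u2 = rng.standard_normal(I2)
        u2 /= np.linalg.norm(u2)
        for _ in range(n_iter):
            # update mode-1 vector: dominant eigenvector of the partial projections
            v1 = np.einsum('mij,j->mi', Xc, u2)        # shape (M, I1)
            u1 = np.linalg.eigh(v1.T @ v1)[1][:, -1]
            for b in basis1:                            # orthogonality in mode 1 only
                u1 = u1 - (u1 @ b) * b
            u1 /= np.linalg.norm(u1)
            # update mode-2 vector: unit norm, no orthogonality constraint
            v2 = np.einsum('mij,i->mj', Xc, u1)        # shape (M, I2)
            u2 = np.linalg.eigh(v2.T @ v2)[1][:, -1]
        basis1.append(u1)
        U.append((u1, u2))
    return U
```

Features for a downstream classifier would then be obtained as features = tvp_features(X - X.mean(axis=0), semi_orthogonal_mpca(X, P)); as the abstract notes, constraining only one mode leaves room for more EMPs, and hence more features, than requiring orthogonality in every mode.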
Similar Papers
A Report on Multilinear PCA Plus Multilinear LDA to Deal with Tensorial Data: Visual Classification as An Example
In practical applications, we often have to deal with high-order data; for example, a grayscale image and a video sequence are intrinsically a 2nd-order tensor and a 3rd-order tensor, respectively. To cluster or classify such high-order data, the conventional approach is to vectorize the data beforehand, as PCA or FDA does, which often induces the curse-of-dimensionality problem. For t...
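To make the vectorization issue concrete, here is a minimal NumPy sketch of the conventional flatten-then-PCA pipeline that the curse-of-dimensionality remark refers to (the synthetic data and array sizes are assumptions for illustration only):

```python
import numpy as np

# Conventional approach: vectorize each 2nd-order tensor (grayscale image)
# before applying PCA. A 100x100 image becomes a 10,000-dimensional vector,
# which is where the curse of dimensionality enters when samples are few.
images = np.random.rand(50, 100, 100)     # 50 synthetic "images" (illustration only)
X = images.reshape(len(images), -1)       # vectorize: shape (50, 10000)
Xc = X - X.mean(axis=0)                   # center
# PCA via SVD of the centered data matrix; at most 49 components carry
# information because the sample size is far below the feature dimension.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:10].T                   # first 10 principal-component scores
```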
Multilinear Subspace Analysis of Image Ensembles
Multilinear algebra, the algebra of higher-order tensors, offers a potent mathematical framework for analyzing ensembles of images resulting from the interaction of any number of underlying factors. We present a dimensionality reduction algorithm that enables subspace analysis within the multilinear framework. This N-mode orthogonal iteration algorithm is based on a tensor decomposition known ...
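The N-mode orthogonal iteration mentioned above is closely related to the higher-order orthogonal iteration (HOOI) for computing a Tucker decomposition. The sketch below illustrates that general scheme; the helper names unfold and hooi and the HOSVD-style initialization are assumptions rather than details taken from the paper.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the given mode to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, n_iter=10):
    """Higher-order orthogonal iteration (sketch): alternately update one
    orthonormal factor matrix per mode while fixing the others."""
    # initialize factors from leading left singular vectors of each unfolding (HOSVD)
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            G = T
            for m in range(T.ndim):
                if m != n:  # project onto all factors except mode n
                    G = np.moveaxis(np.tensordot(U[m].T, G, axes=(1, m)), 0, m)
            U[n] = np.linalg.svd(unfold(G, n), full_matrices=False)[0][:, :ranks[n]]
    # core tensor from the final factors
    G = T
    for m in range(T.ndim):
        G = np.moveaxis(np.tensordot(U[m].T, G, axes=(1, m)), 0, m)
    return G, U

# example: compress a random 3rd-order tensor to a (5, 5, 3) core
core, factors = hooi(np.random.rand(20, 30, 10), ranks=(5, 5, 3))
```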
About Classification Methods Based on Tensor Modelling for Hyperspectral Images
Denoising and dimensionality reduction (DR) are key issues in improving classifier efficiency for hyperspectral images (HSI). The recently developed multi-way Wiener filtering is used, and principal component analysis (PCA), independent component analysis (ICA), and projection pursuit (PP) approaches to DR have been investigated. These matrix-algebra methods are applied to vectorized images. Thereof, the spatial ...
BATCH/SEMI-BATCH PROCESS FAULT DETECTION AND DIAGNOSIS USING ORTHOGONAL NONLINEAR MULTI-WAY PCA: Application to an emulsion co-polymerization process
In this paper, a fault detection and diagnosis approach for batch/semi-batch processes that utilizes the PCA scores subspace is proposed. To develop the diagnosis model, multi-way unfolding is first used to transform the 3-dimensional batch data into 2-dimensional data. Linear and nonlinear correlations are then extracted from the process data by sequentially applying a linear PCA an...
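As a rough illustration of the multi-way unfolding step described above (a sketch on synthetic data; the array sizes, the autoscaling, and the three-dimensional scores subspace are assumptions, and the nonlinear PCA stage is not shown):

```python
import numpy as np

# Batch-wise multi-way unfolding: a (batches x variables x time) array is
# reshaped so that each batch becomes one long row, then linear PCA is
# applied to the unfolded matrix.
batches, variables, time_points = 30, 8, 200
data = np.random.rand(batches, variables, time_points)   # synthetic batch data
X = data.reshape(batches, variables * time_points)       # unfold: (30, 1600)
Xc = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)      # autoscale each column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T    # PCA scores subspace used for fault detection/diagnosis
```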
An Orthogonal DLGE Algorithm With its Application to Face Recognition
Linear Graph Embedding (LGE) is the linearization of graph embedding, which can explain many popular dimensionality reduction algorithms such as LDA, LLE, and LPP. LGE algorithms have been applied successfully in many domains; however, these algorithms need a PCA transform in advance to avoid a possible singularity problem. Further, LGEs are non-orthogonal, and this makes them difficult to ...